[ Wed Sep 28 02:14:51 2022 ] using warm up, epoch: 5
[ Wed Sep 28 02:16:05 2022 ] Parameters:
{'work_dir': 'work_dir/ntu60/csub/fc_joint',
 'model_saved_name': 'work_dir/ntu60/csub/fc_joint/runs',
 'config': 'config/nturgbd-cross-subject/fc.yaml',
 'phase': 'train', 'save_score': False, 'joint_label': [], 'seed': 1,
 'log_interval': 100, 'save_interval': 1, 'save_epoch': 35,
 'eval_interval': 5, 'ema': False, 'print_log': True, 'show_topk': [1, 5],
 'feeder': 'feeders.feeder_ntu.Feeder', 'num_worker': 48,
 'train_feeder_args': {'data_path': 'data/ntu60/NTU60_CS.npz', 'split': 'train', 'debug': False, 'random_choose': False, 'random_shift': False, 'random_move': False, 'window_size': 64, 'normalization': False, 'random_rot': True, 'p_interval': [0.5, 1], 'vel': False, 'bone': False},
 'test_feeder_args': {'data_path': 'data/ntu60/NTU60_CS.npz', 'split': 'test', 'window_size': 64, 'p_interval': [0.95], 'vel': False, 'bone': False, 'debug': False},
 'model': 'model.FC-Chains_L_multi_head_new_12_layers.Model',
 'model_args': {'num_class': 60, 'num_point': 25, 'num_person': 2},
 'weights': None, 'ignore_weights': [], 'base_lr': 0.1, 'step': [90, 100],
 'device': [0], 'optimizer': 'SGD', 'nesterov': True, 'momentum': 0.9,
 'batch_size': 64, 'test_batch_size': 64, 'start_epoch': 0,
 'num_epoch': 110, 'weight_decay': 0.0004, 'lr_decay_rate': 0.1,
 'warm_up_epoch': 5}

[ Wed Sep 28 02:16:05 2022 ] # Parameters: 2082097
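The learning-rate schedule implied by the parameters above (base_lr=0.1, warm_up_epoch=5, step=[90, 100], lr_decay_rate=0.1) can be sketched as below. This is a minimal reconstruction from the config values only; the function name and exact warm-up ramp are assumptions, and the training script's actual code may differ in detail.

```python
def learning_rate(epoch, base_lr=0.1, warm_up_epoch=5,
                  step=(90, 100), lr_decay_rate=0.1):
    """Linear warm-up for the first `warm_up_epoch` epochs, then
    multi-step decay by `lr_decay_rate` at each milestone in `step`.
    `epoch` is 0-indexed, matching 'start_epoch': 0 in the config."""
    if epoch < warm_up_epoch:
        # Ramp linearly up to base_lr over the warm-up epochs.
        return base_lr * (epoch + 1) / warm_up_epoch
    # Multiply by lr_decay_rate once for each milestone already passed.
    passed = sum(1 for s in step if epoch >= s)
    return base_lr * (lr_decay_rate ** passed)
```

Under this sketch, the run trains at 0.1 from epoch 5 until the first decay at epoch 90, which matches the plateau in training accuracy visible through the epochs logged below.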
[ Wed Sep 28 02:16:05 2022 ] Training epoch: 1
[ Wed Sep 28 02:19:15 2022 ] 	Mean training loss: 2.5720. loss2: 0.0000. Mean training acc: 31.31%.
[ Wed Sep 28 02:19:15 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:19:15 2022 ] Eval epoch: 1
[ Wed Sep 28 02:19:46 2022 ] 	Mean test loss of 258 batches: 1.6300689028215039.
[ Wed Sep 28 02:19:46 2022 ] 	Top1: 51.30%
[ Wed Sep 28 02:19:46 2022 ] 	Top5: 87.16%
[ Wed Sep 28 02:19:46 2022 ] Training epoch: 2
[ Wed Sep 28 02:22:56 2022 ] 	Mean training loss: 1.5258. loss2: 0.0000. Mean training acc: 54.34%.
[ Wed Sep 28 02:22:56 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:22:56 2022 ] Eval epoch: 2
[ Wed Sep 28 02:23:26 2022 ] 	Mean test loss of 258 batches: 1.2952618076819782.
[ Wed Sep 28 02:23:26 2022 ] 	Top1: 61.52%
[ Wed Sep 28 02:23:26 2022 ] 	Top5: 90.59%
[ Wed Sep 28 02:23:26 2022 ] Training epoch: 3
[ Wed Sep 28 02:26:36 2022 ] 	Mean training loss: 1.1966. loss2: 0.0000. Mean training acc: 63.42%.
[ Wed Sep 28 02:26:36 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:26:36 2022 ] Eval epoch: 3
[ Wed Sep 28 02:27:06 2022 ] 	Mean test loss of 258 batches: 0.9974248795084251.
[ Wed Sep 28 02:27:06 2022 ] 	Top1: 70.35%
[ Wed Sep 28 02:27:06 2022 ] 	Top5: 93.09%
[ Wed Sep 28 02:27:06 2022 ] Training epoch: 4
[ Wed Sep 28 02:30:16 2022 ] 	Mean training loss: 1.0314. loss2: 0.0000. Mean training acc: 68.18%.
[ Wed Sep 28 02:30:16 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:30:16 2022 ] Eval epoch: 4
[ Wed Sep 28 02:30:46 2022 ] 	Mean test loss of 258 batches: 0.9495958794457043.
[ Wed Sep 28 02:30:46 2022 ] 	Top1: 71.59%
[ Wed Sep 28 02:30:46 2022 ] 	Top5: 93.72%
[ Wed Sep 28 02:30:46 2022 ] Training epoch: 5
[ Wed Sep 28 02:33:56 2022 ] 	Mean training loss: 0.9462. loss2: 0.0000. Mean training acc: 70.30%.
[ Wed Sep 28 02:33:56 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:33:56 2022 ] Eval epoch: 5
[ Wed Sep 28 02:34:26 2022 ] 	Mean test loss of 258 batches: 1.0080597706774408.
[ Wed Sep 28 02:34:26 2022 ] 	Top1: 70.12%
[ Wed Sep 28 02:34:26 2022 ] 	Top5: 93.33%
[ Wed Sep 28 02:34:26 2022 ] Training epoch: 6
[ Wed Sep 28 02:37:37 2022 ] 	Mean training loss: 0.8418. loss2: 0.0000. Mean training acc: 73.73%.
[ Wed Sep 28 02:37:37 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:37:37 2022 ] Eval epoch: 6
[ Wed Sep 28 02:38:07 2022 ] 	Mean test loss of 258 batches: 0.8290886155856673.
[ Wed Sep 28 02:38:07 2022 ] 	Top1: 75.34%
[ Wed Sep 28 02:38:07 2022 ] 	Top5: 94.72%
[ Wed Sep 28 02:38:07 2022 ] Training epoch: 7
[ Wed Sep 28 02:41:17 2022 ] 	Mean training loss: 0.7729. loss2: 0.0000. Mean training acc: 75.64%.
[ Wed Sep 28 02:41:17 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:41:17 2022 ] Eval epoch: 7
[ Wed Sep 28 02:41:47 2022 ] 	Mean test loss of 258 batches: 0.9545065930416418.
[ Wed Sep 28 02:41:47 2022 ] 	Top1: 71.38%
[ Wed Sep 28 02:41:47 2022 ] 	Top5: 94.73%
[ Wed Sep 28 02:41:47 2022 ] Training epoch: 8
[ Wed Sep 28 02:44:57 2022 ] 	Mean training loss: 0.7230. loss2: 0.0000. Mean training acc: 77.40%.
[ Wed Sep 28 02:44:57 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:44:57 2022 ] Eval epoch: 8
[ Wed Sep 28 02:45:27 2022 ] 	Mean test loss of 258 batches: 0.862116629763167.
[ Wed Sep 28 02:45:27 2022 ] 	Top1: 73.41%
[ Wed Sep 28 02:45:27 2022 ] 	Top5: 94.86%
[ Wed Sep 28 02:45:27 2022 ] Training epoch: 9
[ Wed Sep 28 02:48:38 2022 ] 	Mean training loss: 0.6915. loss2: 0.0000. Mean training acc: 78.05%.
[ Wed Sep 28 02:48:38 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:48:38 2022 ] Eval epoch: 9
[ Wed Sep 28 02:49:08 2022 ] 	Mean test loss of 258 batches: 0.7930059237766636.
[ Wed Sep 28 02:49:08 2022 ] 	Top1: 75.98%
[ Wed Sep 28 02:49:08 2022 ] 	Top5: 95.82%
[ Wed Sep 28 02:49:08 2022 ] Training epoch: 10
[ Wed Sep 28 02:52:18 2022 ] 	Mean training loss: 0.6717. loss2: 0.0000. Mean training acc: 78.73%.
[ Wed Sep 28 02:52:18 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:52:18 2022 ] Eval epoch: 10
[ Wed Sep 28 02:52:48 2022 ] 	Mean test loss of 258 batches: 0.8011297516582548.
[ Wed Sep 28 02:52:48 2022 ] 	Top1: 76.02%
[ Wed Sep 28 02:52:48 2022 ] 	Top5: 95.01%
[ Wed Sep 28 02:52:48 2022 ] Training epoch: 11
[ Wed Sep 28 02:55:59 2022 ] 	Mean training loss: 0.6496. loss2: 0.0000. Mean training acc: 79.40%.
[ Wed Sep 28 02:55:59 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:55:59 2022 ] Eval epoch: 11
[ Wed Sep 28 02:56:28 2022 ] 	Mean test loss of 258 batches: 0.8780879883110061.
[ Wed Sep 28 02:56:29 2022 ] 	Top1: 74.37%
[ Wed Sep 28 02:56:29 2022 ] 	Top5: 94.54%
[ Wed Sep 28 02:56:29 2022 ] Training epoch: 12
[ Wed Sep 28 02:59:39 2022 ] 	Mean training loss: 0.6477. loss2: 0.0000. Mean training acc: 79.67%.
[ Wed Sep 28 02:59:39 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:59:39 2022 ] Eval epoch: 12
[ Wed Sep 28 03:00:10 2022 ] 	Mean test loss of 258 batches: 0.743173500248628.
[ Wed Sep 28 03:00:10 2022 ] 	Top1: 77.98%
[ Wed Sep 28 03:00:10 2022 ] 	Top5: 96.25%
[ Wed Sep 28 03:00:10 2022 ] Training epoch: 13
[ Wed Sep 28 03:03:20 2022 ] 	Mean training loss: 0.6217. loss2: 0.0000. Mean training acc: 80.40%.
[ Wed Sep 28 03:03:20 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:03:20 2022 ] Eval epoch: 13
[ Wed Sep 28 03:03:50 2022 ] 	Mean test loss of 258 batches: 0.7868662198153578.
[ Wed Sep 28 03:03:51 2022 ] 	Top1: 77.31%
[ Wed Sep 28 03:03:51 2022 ] 	Top5: 96.22%
[ Wed Sep 28 03:03:51 2022 ] Training epoch: 14
[ Wed Sep 28 03:07:01 2022 ] 	Mean training loss: 0.6029. loss2: 0.0000. Mean training acc: 80.88%.
[ Wed Sep 28 03:07:01 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:07:01 2022 ] Eval epoch: 14
[ Wed Sep 28 03:07:31 2022 ] 	Mean test loss of 258 batches: 0.6818155712166498.
[ Wed Sep 28 03:07:31 2022 ] 	Top1: 79.47%
[ Wed Sep 28 03:07:31 2022 ] 	Top5: 96.62%
[ Wed Sep 28 03:07:31 2022 ] Training epoch: 15
[ Wed Sep 28 03:10:42 2022 ] 	Mean training loss: 0.5961. loss2: 0.0000. Mean training acc: 81.23%.
[ Wed Sep 28 03:10:42 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:10:42 2022 ] Eval epoch: 15
[ Wed Sep 28 03:11:12 2022 ] 	Mean test loss of 258 batches: 0.7680001462152762.
[ Wed Sep 28 03:11:12 2022 ] 	Top1: 77.27%
[ Wed Sep 28 03:11:12 2022 ] 	Top5: 95.48%
[ Wed Sep 28 03:11:12 2022 ] Training epoch: 16
[ Wed Sep 28 03:14:23 2022 ] 	Mean training loss: 0.5841. loss2: 0.0000. Mean training acc: 81.53%.
[ Wed Sep 28 03:14:23 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:14:23 2022 ] Eval epoch: 16
[ Wed Sep 28 03:14:53 2022 ] 	Mean test loss of 258 batches: 0.8036933888991674.
[ Wed Sep 28 03:14:53 2022 ] 	Top1: 76.11%
[ Wed Sep 28 03:14:54 2022 ] 	Top5: 95.55%
[ Wed Sep 28 03:14:54 2022 ] Training epoch: 17
[ Wed Sep 28 03:18:04 2022 ] 	Mean training loss: 0.5719. loss2: 0.0000. Mean training acc: 81.97%.
[ Wed Sep 28 03:18:04 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:18:04 2022 ] Eval epoch: 17
[ Wed Sep 28 03:18:34 2022 ] 	Mean test loss of 258 batches: 0.7066249854920447.
[ Wed Sep 28 03:18:34 2022 ] 	Top1: 79.02%
[ Wed Sep 28 03:18:35 2022 ] 	Top5: 96.21%
[ Wed Sep 28 03:18:35 2022 ] Training epoch: 18
[ Wed Sep 28 03:21:45 2022 ] 	Mean training loss: 0.5602. loss2: 0.0000. Mean training acc: 82.31%.
[ Wed Sep 28 03:21:45 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:21:45 2022 ] Eval epoch: 18
[ Wed Sep 28 03:22:15 2022 ] 	Mean test loss of 258 batches: 0.7538140722832014.
[ Wed Sep 28 03:22:15 2022 ] 	Top1: 77.60%
[ Wed Sep 28 03:22:16 2022 ] 	Top5: 95.61%
[ Wed Sep 28 03:22:16 2022 ] Training epoch: 19
[ Wed Sep 28 03:25:26 2022 ] 	Mean training loss: 0.5567. loss2: 0.0000. Mean training acc: 82.46%.
[ Wed Sep 28 03:25:26 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:25:26 2022 ] Eval epoch: 19
[ Wed Sep 28 03:25:56 2022 ] 	Mean test loss of 258 batches: 0.7441971269688865.
[ Wed Sep 28 03:25:56 2022 ] 	Top1: 78.15%
[ Wed Sep 28 03:25:56 2022 ] 	Top5: 95.93%
[ Wed Sep 28 03:25:56 2022 ] Training epoch: 20
[ Wed Sep 28 03:29:07 2022 ] 	Mean training loss: 0.5513. loss2: 0.0000. Mean training acc: 82.72%.
[ Wed Sep 28 03:29:07 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:29:07 2022 ] Eval epoch: 20
[ Wed Sep 28 03:29:37 2022 ] 	Mean test loss of 258 batches: 0.8001263995272244.
[ Wed Sep 28 03:29:37 2022 ] 	Top1: 76.68%
[ Wed Sep 28 03:29:37 2022 ] 	Top5: 95.01%
[ Wed Sep 28 03:29:37 2022 ] Training epoch: 21
[ Wed Sep 28 03:32:48 2022 ] 	Mean training loss: 0.5372. loss2: 0.0000. Mean training acc: 83.06%.
[ Wed Sep 28 03:32:48 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:32:48 2022 ] Eval epoch: 21
[ Wed Sep 28 03:33:18 2022 ] 	Mean test loss of 258 batches: 0.7744144912260448.
[ Wed Sep 28 03:33:18 2022 ] 	Top1: 77.90%
[ Wed Sep 28 03:33:18 2022 ] 	Top5: 96.02%
[ Wed Sep 28 03:33:18 2022 ] Training epoch: 22
[ Wed Sep 28 03:36:28 2022 ] 	Mean training loss: 0.5303. loss2: 0.0000. Mean training acc: 83.09%.
[ Wed Sep 28 03:36:28 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:36:28 2022 ] Eval epoch: 22
[ Wed Sep 28 03:36:58 2022 ] 	Mean test loss of 258 batches: 0.781179722196372.
[ Wed Sep 28 03:36:58 2022 ] 	Top1: 78.30%
[ Wed Sep 28 03:36:58 2022 ] 	Top5: 95.42%
[ Wed Sep 28 03:36:58 2022 ] Training epoch: 23
[ Wed Sep 28 03:40:09 2022 ] 	Mean training loss: 0.5301. loss2: 0.0000. Mean training acc: 83.29%.
[ Wed Sep 28 03:40:09 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:40:09 2022 ] Eval epoch: 23
[ Wed Sep 28 03:40:39 2022 ] 	Mean test loss of 258 batches: 0.8862640466107878.
[ Wed Sep 28 03:40:39 2022 ] 	Top1: 75.08%
[ Wed Sep 28 03:40:39 2022 ] 	Top5: 93.95%
[ Wed Sep 28 03:40:39 2022 ] Training epoch: 24
[ Wed Sep 28 03:43:49 2022 ] 	Mean training loss: 0.5267. loss2: 0.0000. Mean training acc: 83.26%.
[ Wed Sep 28 03:43:49 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:43:49 2022 ] Eval epoch: 24
[ Wed Sep 28 03:44:19 2022 ] 	Mean test loss of 258 batches: 0.7823873274894648.
[ Wed Sep 28 03:44:19 2022 ] 	Top1: 77.26%
[ Wed Sep 28 03:44:20 2022 ] 	Top5: 95.31%
[ Wed Sep 28 03:44:20 2022 ] Training epoch: 25
[ Wed Sep 28 03:47:30 2022 ] 	Mean training loss: 0.5177. loss2: 0.0000. Mean training acc: 83.73%.
[ Wed Sep 28 03:47:30 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:47:30 2022 ] Eval epoch: 25
[ Wed Sep 28 03:48:00 2022 ] 	Mean test loss of 258 batches: 0.810917711997217.
[ Wed Sep 28 03:48:00 2022 ] 	Top1: 76.45%
[ Wed Sep 28 03:48:00 2022 ] 	Top5: 95.63%
[ Wed Sep 28 03:48:00 2022 ] Training epoch: 26
[ Wed Sep 28 03:51:11 2022 ] 	Mean training loss: 0.5211. loss2: 0.0000. Mean training acc: 83.58%.
[ Wed Sep 28 03:51:11 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:51:11 2022 ] Eval epoch: 26
[ Wed Sep 28 03:51:41 2022 ] 	Mean test loss of 258 batches: 0.86904382486214.
[ Wed Sep 28 03:51:41 2022 ] 	Top1: 75.59%
[ Wed Sep 28 03:51:41 2022 ] 	Top5: 93.99%
[ Wed Sep 28 03:51:41 2022 ] Training epoch: 27
[ Wed Sep 28 03:54:51 2022 ] 	Mean training loss: 0.5129. loss2: 0.0000. Mean training acc: 83.85%.
[ Wed Sep 28 03:54:51 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:54:51 2022 ] Eval epoch: 27
[ Wed Sep 28 03:55:21 2022 ] 	Mean test loss of 258 batches: 0.6772212959652724.
[ Wed Sep 28 03:55:22 2022 ] 	Top1: 79.63%
[ Wed Sep 28 03:55:22 2022 ] 	Top5: 96.66%
[ Wed Sep 28 03:55:22 2022 ] Training epoch: 28
[ Wed Sep 28 03:58:32 2022 ] 	Mean training loss: 0.5119. loss2: 0.0000. Mean training acc: 83.79%.
[ Wed Sep 28 03:58:32 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:58:32 2022 ] Eval epoch: 28
[ Wed Sep 28 03:59:02 2022 ] 	Mean test loss of 258 batches: 0.7045101780993069.
[ Wed Sep 28 03:59:02 2022 ] 	Top1: 79.52%
[ Wed Sep 28 03:59:02 2022 ] 	Top5: 95.81%
[ Wed Sep 28 03:59:02 2022 ] Training epoch: 29
[ Wed Sep 28 04:02:13 2022 ] 	Mean training loss: 0.5135. loss2: 0.0000. Mean training acc: 83.59%.
[ Wed Sep 28 04:02:13 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:02:13 2022 ] Eval epoch: 29
[ Wed Sep 28 04:02:43 2022 ] 	Mean test loss of 258 batches: 0.7446456790663475.
[ Wed Sep 28 04:02:43 2022 ] 	Top1: 78.44%
[ Wed Sep 28 04:02:43 2022 ] 	Top5: 95.95%
[ Wed Sep 28 04:02:44 2022 ] Training epoch: 30
[ Wed Sep 28 04:05:54 2022 ] 	Mean training loss: 0.5081. loss2: 0.0000. Mean training acc: 83.66%.
[ Wed Sep 28 04:05:54 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:05:54 2022 ] Eval epoch: 30
[ Wed Sep 28 04:06:24 2022 ] 	Mean test loss of 258 batches: 0.6351659860490828.
[ Wed Sep 28 04:06:24 2022 ] 	Top1: 81.22%
[ Wed Sep 28 04:06:24 2022 ] 	Top5: 96.54%
[ Wed Sep 28 04:06:24 2022 ] Training epoch: 31
[ Wed Sep 28 04:09:34 2022 ] 	Mean training loss: 0.5020. loss2: 0.0000. Mean training acc: 84.03%.
[ Wed Sep 28 04:09:34 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:09:34 2022 ] Eval epoch: 31
[ Wed Sep 28 04:10:04 2022 ] 	Mean test loss of 258 batches: 0.9745526372693306.
[ Wed Sep 28 04:10:04 2022 ] 	Top1: 72.38%
[ Wed Sep 28 04:10:05 2022 ] 	Top5: 93.21%
[ Wed Sep 28 04:10:05 2022 ] Training epoch: 32
[ Wed Sep 28 04:13:15 2022 ] 	Mean training loss: 0.5053. loss2: 0.0000. Mean training acc: 84.02%.
[ Wed Sep 28 04:13:15 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:13:15 2022 ] Eval epoch: 32
[ Wed Sep 28 04:13:45 2022 ] 	Mean test loss of 258 batches: 0.7158956687695296.
[ Wed Sep 28 04:13:45 2022 ] 	Top1: 79.06%
[ Wed Sep 28 04:13:45 2022 ] 	Top5: 96.50%
[ Wed Sep 28 04:13:45 2022 ] Training epoch: 33
[ Wed Sep 28 04:16:56 2022 ] 	Mean training loss: 0.5064. loss2: 0.0000. Mean training acc: 84.03%.
[ Wed Sep 28 04:16:56 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:16:56 2022 ] Eval epoch: 33
[ Wed Sep 28 04:17:26 2022 ] 	Mean test loss of 258 batches: 0.9117347102525623.
[ Wed Sep 28 04:17:26 2022 ] 	Top1: 75.39%
[ Wed Sep 28 04:17:26 2022 ] 	Top5: 94.80%
[ Wed Sep 28 04:17:26 2022 ] Training epoch: 34
[ Wed Sep 28 04:20:36 2022 ] 	Mean training loss: 0.5025. loss2: 0.0000. Mean training acc: 84.10%.
[ Wed Sep 28 04:20:36 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:20:36 2022 ] Eval epoch: 34
[ Wed Sep 28 04:21:06 2022 ] 	Mean test loss of 258 batches: 0.7191981621837431.
[ Wed Sep 28 04:21:06 2022 ] 	Top1: 78.70%
[ Wed Sep 28 04:21:07 2022 ] 	Top5: 96.14%
[ Wed Sep 28 04:21:07 2022 ] Training epoch: 35
[ Wed Sep 28 04:24:17 2022 ] 	Mean training loss: 0.4945. loss2: 0.0000. Mean training acc: 84.36%.
[ Wed Sep 28 04:24:17 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:24:17 2022 ] Eval epoch: 35
[ Wed Sep 28 04:24:47 2022 ] 	Mean test loss of 258 batches: 0.6723006550193757.
[ Wed Sep 28 04:24:47 2022 ] 	Top1: 79.39%
[ Wed Sep 28 04:24:47 2022 ] 	Top5: 96.34%
[ Wed Sep 28 04:24:47 2022 ] Training epoch: 36
[ Wed Sep 28 04:27:57 2022 ] 	Mean training loss: 0.4956. loss2: 0.0000. Mean training acc: 84.25%.
[ Wed Sep 28 04:27:57 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:27:57 2022 ] Eval epoch: 36
[ Wed Sep 28 04:28:28 2022 ] 	Mean test loss of 258 batches: 0.6649421428063119.
[ Wed Sep 28 04:28:28 2022 ] 	Top1: 80.28%
[ Wed Sep 28 04:28:28 2022 ] 	Top5: 96.32%
[ Wed Sep 28 04:28:28 2022 ] Training epoch: 37
[ Wed Sep 28 04:31:39 2022 ] 	Mean training loss: 0.4998. loss2: 0.0000. Mean training acc: 84.04%.
[ Wed Sep 28 04:31:39 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:31:39 2022 ] Eval epoch: 37
[ Wed Sep 28 04:32:09 2022 ] 	Mean test loss of 258 batches: 0.6546458361107249.
[ Wed Sep 28 04:32:09 2022 ] 	Top1: 80.30%
[ Wed Sep 28 04:32:10 2022 ] 	Top5: 96.55%
[ Wed Sep 28 04:32:10 2022 ] Training epoch: 38
[ Wed Sep 28 04:35:20 2022 ] 	Mean training loss: 0.4936. loss2: 0.0000. Mean training acc: 84.27%.
[ Wed Sep 28 04:35:20 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:35:20 2022 ] Eval epoch: 38
[ Wed Sep 28 04:35:50 2022 ] 	Mean test loss of 258 batches: 0.6413131526043249.
[ Wed Sep 28 04:35:50 2022 ] 	Top1: 81.00%
[ Wed Sep 28 04:35:50 2022 ] 	Top5: 96.82%
[ Wed Sep 28 04:35:50 2022 ] Training epoch: 39
[ Wed Sep 28 04:39:01 2022 ] 	Mean training loss: 0.4918. loss2: 0.0000. Mean training acc: 84.25%.
[ Wed Sep 28 04:39:01 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:39:01 2022 ] Eval epoch: 39
[ Wed Sep 28 04:39:31 2022 ] 	Mean test loss of 258 batches: 0.661762843875922.
[ Wed Sep 28 04:39:31 2022 ] 	Top1: 80.22%
[ Wed Sep 28 04:39:31 2022 ] 	Top5: 96.63%
[ Wed Sep 28 04:39:31 2022 ] Training epoch: 40
[ Wed Sep 28 04:42:42 2022 ] 	Mean training loss: 0.4852. loss2: 0.0000. Mean training acc: 84.55%.
[ Wed Sep 28 04:42:42 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:42:42 2022 ] Eval epoch: 40
[ Wed Sep 28 04:43:12 2022 ] 	Mean test loss of 258 batches: 1.2011493439822234.
[ Wed Sep 28 04:43:12 2022 ] 	Top1: 68.11%
[ Wed Sep 28 04:43:12 2022 ] 	Top5: 90.73%
[ Wed Sep 28 04:43:12 2022 ] Training epoch: 41
[ Wed Sep 28 04:46:23 2022 ] 	Mean training loss: 0.4857. loss2: 0.0000. Mean training acc: 84.68%.
[ Wed Sep 28 04:46:23 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:46:23 2022 ] Eval epoch: 41
[ Wed Sep 28 04:46:53 2022 ] 	Mean test loss of 258 batches: 0.749880360078442.
[ Wed Sep 28 04:46:53 2022 ] 	Top1: 78.06%
[ Wed Sep 28 04:46:53 2022 ] 	Top5: 95.46%
[ Wed Sep 28 04:46:53 2022 ] Training epoch: 42
[ Wed Sep 28 04:50:04 2022 ] 	Mean training loss: 0.4770. loss2: 0.0000. Mean training acc: 84.83%.
[ Wed Sep 28 04:50:04 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:50:04 2022 ] Eval epoch: 42
[ Wed Sep 28 04:50:34 2022 ] 	Mean test loss of 258 batches: 0.6799992079767145.
[ Wed Sep 28 04:50:34 2022 ] 	Top1: 80.03%
[ Wed Sep 28 04:50:34 2022 ] 	Top5: 96.39%
[ Wed Sep 28 04:50:34 2022 ] Training epoch: 43
[ Wed Sep 28 04:53:45 2022 ] 	Mean training loss: 0.4858. loss2: 0.0000. Mean training acc: 84.52%.
[ Wed Sep 28 04:53:45 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:53:45 2022 ] Eval epoch: 43
[ Wed Sep 28 04:54:15 2022 ] 	Mean test loss of 258 batches: 0.9858882156915443.
[ Wed Sep 28 04:54:15 2022 ] 	Top1: 71.72%
[ Wed Sep 28 04:54:15 2022 ] 	Top5: 93.01%
[ Wed Sep 28 04:54:15 2022 ] Training epoch: 44
[ Wed Sep 28 04:57:26 2022 ] 	Mean training loss: 0.4815. loss2: 0.0000. Mean training acc: 84.80%.
[ Wed Sep 28 04:57:26 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:57:26 2022 ] Eval epoch: 44
[ Wed Sep 28 04:57:56 2022 ] 	Mean test loss of 258 batches: 0.8858001783143642.
[ Wed Sep 28 04:57:56 2022 ] 	Top1: 75.07%
[ Wed Sep 28 04:57:56 2022 ] 	Top5: 95.88%
[ Wed Sep 28 04:57:56 2022 ] Training epoch: 45
[ Wed Sep 28 05:01:06 2022 ] 	Mean training loss: 0.4811. loss2: 0.0000. Mean training acc: 84.71%.
[ Wed Sep 28 05:01:06 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:01:06 2022 ] Eval epoch: 45
[ Wed Sep 28 05:01:36 2022 ] 	Mean test loss of 258 batches: 0.650626307714355.
[ Wed Sep 28 05:01:36 2022 ] 	Top1: 80.34%
[ Wed Sep 28 05:01:37 2022 ] 	Top5: 96.01%
[ Wed Sep 28 05:01:37 2022 ] Training epoch: 46
[ Wed Sep 28 05:04:47 2022 ] 	Mean training loss: 0.4793. loss2: 0.0000. Mean training acc: 84.75%.
[ Wed Sep 28 05:04:47 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:04:47 2022 ] Eval epoch: 46
[ Wed Sep 28 05:05:17 2022 ] 	Mean test loss of 258 batches: 0.7057554379914158.
[ Wed Sep 28 05:05:17 2022 ] 	Top1: 79.43%
[ Wed Sep 28 05:05:17 2022 ] 	Top5: 96.26%
[ Wed Sep 28 05:05:17 2022 ] Training epoch: 47
[ Wed Sep 28 05:08:28 2022 ] 	Mean training loss: 0.4673. loss2: 0.0000. Mean training acc: 84.99%.
[ Wed Sep 28 05:08:28 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:08:28 2022 ] Eval epoch: 47
[ Wed Sep 28 05:08:58 2022 ] 	Mean test loss of 258 batches: 2.338464133961256.
[ Wed Sep 28 05:08:58 2022 ] 	Top1: 56.93%
[ Wed Sep 28 05:08:58 2022 ] 	Top5: 88.33%
[ Wed Sep 28 05:08:58 2022 ] Training epoch: 48
[ Wed Sep 28 05:12:09 2022 ] 	Mean training loss: 0.4803. loss2: 0.0000. Mean training acc: 84.78%.
[ Wed Sep 28 05:12:09 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:12:09 2022 ] Eval epoch: 48
[ Wed Sep 28 05:12:39 2022 ] 	Mean test loss of 258 batches: 0.9187106984530309.
[ Wed Sep 28 05:12:39 2022 ] 	Top1: 74.11%
[ Wed Sep 28 05:12:39 2022 ] 	Top5: 94.06%
[ Wed Sep 28 05:12:39 2022 ] Training epoch: 49
[ Wed Sep 28 05:15:49 2022 ] 	Mean training loss: 0.4878. loss2: 0.0000. Mean training acc: 84.52%.
[ Wed Sep 28 05:15:49 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:15:49 2022 ] Eval epoch: 49
[ Wed Sep 28 05:16:19 2022 ] 	Mean test loss of 258 batches: 0.6449852396351422.
[ Wed Sep 28 05:16:19 2022 ] 	Top1: 80.65%
[ Wed Sep 28 05:16:20 2022 ] 	Top5: 96.71%
[ Wed Sep 28 05:16:20 2022 ] Training epoch: 50
[ Wed Sep 28 05:19:30 2022 ] 	Mean training loss: 0.4747. loss2: 0.0000. Mean training acc: 84.75%.
[ Wed Sep 28 05:19:30 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:19:30 2022 ] Eval epoch: 50
[ Wed Sep 28 05:20:00 2022 ] 	Mean test loss of 258 batches: 0.6367888790230418.
[ Wed Sep 28 05:20:00 2022 ] 	Top1: 80.51%
[ Wed Sep 28 05:20:00 2022 ] 	Top5: 97.17%
[ Wed Sep 28 05:20:00 2022 ] Training epoch: 51
[ Wed Sep 28 05:23:11 2022 ] 	Mean training loss: 0.4803. loss2: 0.0000. Mean training acc: 84.66%.
[ Wed Sep 28 05:23:11 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:23:11 2022 ] Eval epoch: 51
[ Wed Sep 28 05:23:41 2022 ] 	Mean test loss of 258 batches: 0.7610338739184446.
[ Wed Sep 28 05:23:41 2022 ] 	Top1: 78.33%
[ Wed Sep 28 05:23:41 2022 ] 	Top5: 95.37%
[ Wed Sep 28 05:23:41 2022 ] Training epoch: 52
[ Wed Sep 28 05:26:51 2022 ] 	Mean training loss: 0.4766. loss2: 0.0000. Mean training acc: 84.86%.
[ Wed Sep 28 05:26:51 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:26:51 2022 ] Eval epoch: 52
[ Wed Sep 28 05:27:21 2022 ] 	Mean test loss of 258 batches: 0.7596102408313936.
[ Wed Sep 28 05:27:21 2022 ] 	Top1: 77.30%
[ Wed Sep 28 05:27:21 2022 ] 	Top5: 95.67%
[ Wed Sep 28 05:27:21 2022 ] Training epoch: 53
[ Wed Sep 28 05:30:32 2022 ] 	Mean training loss: 0.4703. loss2: 0.0000. Mean training acc: 84.89%.
[ Wed Sep 28 05:30:32 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:30:32 2022 ] Eval epoch: 53
[ Wed Sep 28 05:31:02 2022 ] 	Mean test loss of 258 batches: 0.9166607304598934.
[ Wed Sep 28 05:31:02 2022 ] 	Top1: 74.45%
[ Wed Sep 28 05:31:02 2022 ] 	Top5: 94.44%
[ Wed Sep 28 05:31:02 2022 ] Training epoch: 54
[ Wed Sep 28 05:34:13 2022 ] 	Mean training loss: 0.4668. loss2: 0.0000. Mean training acc: 85.11%.
[ Wed Sep 28 05:34:13 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:34:13 2022 ] Eval epoch: 54
[ Wed Sep 28 05:34:43 2022 ] 	Mean test loss of 258 batches: 0.7159772214963455.
[ Wed Sep 28 05:34:43 2022 ] 	Top1: 78.36%
[ Wed Sep 28 05:34:44 2022 ] 	Top5: 96.54%
[ Wed Sep 28 05:34:44 2022 ] Training epoch: 55
[ Wed Sep 28 05:37:54 2022 ] 	Mean training loss: 0.4743. loss2: 0.0000. Mean training acc: 85.08%.
[ Wed Sep 28 05:37:54 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:37:54 2022 ] Eval epoch: 55
[ Wed Sep 28 05:38:24 2022 ] 	Mean test loss of 258 batches: 0.6898485699365305.
[ Wed Sep 28 05:38:24 2022 ] 	Top1: 79.81%
[ Wed Sep 28 05:38:24 2022 ] 	Top5: 96.46%
[ Wed Sep 28 05:38:24 2022 ] Training epoch: 56
[ Wed Sep 28 05:41:35 2022 ] 	Mean training loss: 0.4755. loss2: 0.0000. Mean training acc: 84.80%.
[ Wed Sep 28 05:41:35 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:41:35 2022 ] Eval epoch: 56
[ Wed Sep 28 05:42:05 2022 ] 	Mean test loss of 258 batches: 1.1623910926571188.
[ Wed Sep 28 05:42:05 2022 ] 	Top1: 70.64%
[ Wed Sep 28 05:42:05 2022 ] 	Top5: 89.68%
[ Wed Sep 28 05:42:05 2022 ] Training epoch: 57
[ Wed Sep 28 05:45:16 2022 ] 	Mean training loss: 0.4692. loss2: 0.0000. Mean training acc: 85.12%.
[ Wed Sep 28 05:45:16 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:45:16 2022 ] Eval epoch: 57
[ Wed Sep 28 05:45:46 2022 ] 	Mean test loss of 258 batches: 0.781904109919718.
[ Wed Sep 28 05:45:46 2022 ] 	Top1: 76.71%
[ Wed Sep 28 05:45:46 2022 ] 	Top5: 95.54%
[ Wed Sep 28 05:45:46 2022 ] Training epoch: 58
[ Wed Sep 28 05:48:57 2022 ] 	Mean training loss: 0.4634. loss2: 0.0000. Mean training acc: 85.18%.
[ Wed Sep 28 05:48:57 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:48:57 2022 ] Eval epoch: 58
[ Wed Sep 28 05:49:27 2022 ] 	Mean test loss of 258 batches: 0.7871425207271132.
[ Wed Sep 28 05:49:27 2022 ] 	Top1: 76.80%
[ Wed Sep 28 05:49:27 2022 ] 	Top5: 95.46%
[ Wed Sep 28 05:49:27 2022 ] Training epoch: 59
[ Wed Sep 28 05:52:38 2022 ] 	Mean training loss: 0.4691. loss2: 0.0000. Mean training acc: 84.99%.
[ Wed Sep 28 05:52:38 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:52:38 2022 ] Eval epoch: 59
[ Wed Sep 28 05:53:08 2022 ] 	Mean test loss of 258 batches: 0.6177894268329291.
[ Wed Sep 28 05:53:08 2022 ] 	Top1: 81.22%
[ Wed Sep 28 05:53:08 2022 ] 	Top5: 96.96%
[ Wed Sep 28 05:53:08 2022 ] Training epoch: 60
[ Wed Sep 28 05:56:18 2022 ] 	Mean training loss: 0.4695. loss2: 0.0000. Mean training acc: 85.20%.
[ Wed Sep 28 05:56:18 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:56:19 2022 ] Eval epoch: 60
[ Wed Sep 28 05:56:49 2022 ] 	Mean test loss of 258 batches: 0.6077042976899665.
[ Wed Sep 28 05:56:49 2022 ] 	Top1: 81.89%
[ Wed Sep 28 05:56:49 2022 ] 	Top5: 96.99%
[ Wed Sep 28 05:56:49 2022 ] Training epoch: 61
[ Wed Sep 28 05:59:59 2022 ] 	Mean training loss: 0.4640. loss2: 0.0000. Mean training acc: 85.15%.
[ Wed Sep 28 05:59:59 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:00:00 2022 ] Eval epoch: 61
[ Wed Sep 28 06:00:30 2022 ] 	Mean test loss of 258 batches: 0.8297493029241414.
[ Wed Sep 28 06:00:30 2022 ] 	Top1: 75.79%
[ Wed Sep 28 06:00:30 2022 ] 	Top5: 94.95%
[ Wed Sep 28 06:00:30 2022 ] Training epoch: 62
[ Wed Sep 28 06:03:41 2022 ] 	Mean training loss: 0.4553. loss2: 0.0000. Mean training acc: 85.43%.
[ Wed Sep 28 06:03:41 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:03:41 2022 ] Eval epoch: 62
[ Wed Sep 28 06:04:11 2022 ] 	Mean test loss of 258 batches: 0.6756045178387516.
[ Wed Sep 28 06:04:11 2022 ] 	Top1: 79.43%
[ Wed Sep 28 06:04:11 2022 ] 	Top5: 96.63%
[ Wed Sep 28 06:04:11 2022 ] Training epoch: 63
[ Wed Sep 28 06:07:21 2022 ] 	Mean training loss: 0.4682. loss2: 0.0000. Mean training acc: 85.21%.
[ Wed Sep 28 06:07:21 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:07:21 2022 ] Eval epoch: 63
[ Wed Sep 28 06:07:51 2022 ] 	Mean test loss of 258 batches: 0.8620480220447215.
[ Wed Sep 28 06:07:51 2022 ] 	Top1: 75.36%
[ Wed Sep 28 06:07:51 2022 ] 	Top5: 95.22%
[ Wed Sep 28 06:07:52 2022 ] Training epoch: 64
[ Wed Sep 28 06:11:02 2022 ] 	Mean training loss: 0.4677. loss2: 0.0000. Mean training acc: 85.07%.
[ Wed Sep 28 06:11:02 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:11:02 2022 ] Eval epoch: 64
[ Wed Sep 28 06:11:32 2022 ] 	Mean test loss of 258 batches: 0.6083020170529684.
[ Wed Sep 28 06:11:32 2022 ] 	Top1: 81.94%
[ Wed Sep 28 06:11:32 2022 ] 	Top5: 96.98%
[ Wed Sep 28 06:11:32 2022 ] Training epoch: 65
[ Wed Sep 28 06:14:43 2022 ] 	Mean training loss: 0.4591. loss2: 0.0000. Mean training acc: 85.30%.
[ Wed Sep 28 06:14:43 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:14:43 2022 ] Eval epoch: 65
[ Wed Sep 28 06:15:14 2022 ] 	Mean test loss of 258 batches: 0.6339124695275181.
[ Wed Sep 28 06:15:14 2022 ] 	Top1: 80.85%
[ Wed Sep 28 06:15:14 2022 ] 	Top5: 96.64%
[ Wed Sep 28 06:15:14 2022 ] Training epoch: 66
[ Wed Sep 28 06:18:25 2022 ] 	Mean training loss: 0.4750. loss2: 0.0000. Mean training acc: 84.97%.
[ Wed Sep 28 06:18:25 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:18:25 2022 ] Eval epoch: 66
[ Wed Sep 28 06:18:55 2022 ] 	Mean test loss of 258 batches: 0.7304151469422865.
[ Wed Sep 28 06:18:56 2022 ] 	Top1: 78.87%
[ Wed Sep 28 06:18:56 2022 ] 	Top5: 95.34%
[ Wed Sep 28 06:18:56 2022 ] Training epoch: 67
[ Wed Sep 28 06:22:06 2022 ] 	Mean training loss: 0.4661. loss2: 0.0000. Mean training acc: 85.28%.
[ Wed Sep 28 06:22:06 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:22:06 2022 ] Eval epoch: 67
[ Wed Sep 28 06:22:36 2022 ] 	Mean test loss of 258 batches: 0.7260417755722075.
[ Wed Sep 28 06:22:36 2022 ] 	Top1: 79.12%
[ Wed Sep 28 06:22:36 2022 ] 	Top5: 95.68%
[ Wed Sep 28 06:22:37 2022 ] Training epoch: 68
[ Wed Sep 28 06:25:47 2022 ] 	Mean training loss: 0.4580. loss2: 0.0000. Mean training acc: 85.50%.
[ Wed Sep 28 06:25:47 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:25:47 2022 ] Eval epoch: 68
[ Wed Sep 28 06:26:17 2022 ] 	Mean test loss of 258 batches: 0.8767418641914693.
[ Wed Sep 28 06:26:17 2022 ] 	Top1: 74.59%
[ Wed Sep 28 06:26:17 2022 ] 	Top5: 94.69%
[ Wed Sep 28 06:26:17 2022 ] Training epoch: 69
[ Wed Sep 28 06:29:28 2022 ] 	Mean training loss: 0.4699. loss2: 0.0000. Mean training acc: 85.24%.
[ Wed Sep 28 06:29:28 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:29:28 2022 ] Eval epoch: 69
[ Wed Sep 28 06:29:58 2022 ] 	Mean test loss of 258 batches: 0.8756725506265034.
[ Wed Sep 28 06:29:58 2022 ] 	Top1: 75.34%
[ Wed Sep 28 06:29:58 2022 ] 	Top5: 95.21%
[ Wed Sep 28 06:29:58 2022 ] Training epoch: 70
[ Wed Sep 28 06:33:08 2022 ] 	Mean training loss: 0.4585. loss2: 0.0000. Mean training acc: 85.51%.
[ Wed Sep 28 06:33:08 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:33:09 2022 ] Eval epoch: 70
[ Wed Sep 28 06:33:39 2022 ] 	Mean test loss of 258 batches: 0.5853706820066585.
[ Wed Sep 28 06:33:39 2022 ] 	Top1: 82.18%
[ Wed Sep 28 06:33:39 2022 ] 	Top5: 97.09%
[ Wed Sep 28 06:33:39 2022 ] Training epoch: 71
[ Wed Sep 28 06:36:49 2022 ] 	Mean training loss: 0.4606. loss2: 0.0000. Mean training acc: 85.39%.
[ Wed Sep 28 06:36:49 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:36:49 2022 ] Eval epoch: 71
[ Wed Sep 28 06:37:19 2022 ] 	Mean test loss of 258 batches: 0.6576453713021537.
[ Wed Sep 28 06:37:20 2022 ] 	Top1: 80.33%
[ Wed Sep 28 06:37:20 2022 ] 	Top5: 96.12%
[ Wed Sep 28 06:37:20 2022 ] Training epoch: 72
[ Wed Sep 28 06:40:30 2022 ] 	Mean training loss: 0.4578. loss2: 0.0000. Mean training acc: 85.34%.
[ Wed Sep 28 06:40:30 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:40:30 2022 ] Eval epoch: 72
[ Wed Sep 28 06:41:00 2022 ] 	Mean test loss of 258 batches: 0.7448714043973952.
[ Wed Sep 28 06:41:00 2022 ] 	Top1: 79.44%
[ Wed Sep 28 06:41:00 2022 ] 	Top5: 95.35%
[ Wed Sep 28 06:41:00 2022 ] Training epoch: 73
[ Wed Sep 28 06:44:11 2022 ] 	Mean training loss: 0.4644. loss2: 0.0000. Mean training acc: 85.30%.
[ Wed Sep 28 06:44:11 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:44:11 2022 ] Eval epoch: 73
[ Wed Sep 28 06:44:41 2022 ] 	Mean test loss of 258 batches: 1.2604422094516976.
[ Wed Sep 28 06:44:41 2022 ] 	Top1: 67.55%
[ Wed Sep 28 06:44:41 2022 ] 	Top5: 90.08%
[ Wed Sep 28 06:44:41 2022 ] Training epoch: 74
[ Wed Sep 28 06:47:52 2022 ] 	Mean training loss: 0.4607. loss2: 0.0000. Mean training acc: 85.40%.
[ Wed Sep 28 06:47:52 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:47:52 2022 ] Eval epoch: 74
[ Wed Sep 28 06:48:22 2022 ] 	Mean test loss of 258 batches: 0.8043339406104051.
[ Wed Sep 28 06:48:22 2022 ] 	Top1: 76.94%
[ Wed Sep 28 06:48:22 2022 ] 	Top5: 94.64%
[ Wed Sep 28 06:48:22 2022 ] Training epoch: 75
[ Wed Sep 28 06:51:33 2022 ] 	Mean training loss: 0.4588. loss2: 0.0000. Mean training acc: 85.37%.
[ Wed Sep 28 06:51:33 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:51:33 2022 ] Eval epoch: 75
[ Wed Sep 28 06:52:03 2022 ] 	Mean test loss of 258 batches: 0.6654719317606254.
[ Wed Sep 28 06:52:04 2022 ] 	Top1: 80.48%
[ Wed Sep 28 06:52:04 2022 ] 	Top5: 96.01%
[ Wed Sep 28 06:52:04 2022 ] Training epoch: 76
[ Wed Sep 28 06:55:14 2022 ] 	Mean training loss: 0.4607. loss2: 0.0000. Mean training acc: 85.41%.
[ Wed Sep 28 06:55:14 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:55:14 2022 ] Eval epoch: 76
[ Wed Sep 28 06:55:44 2022 ] 	Mean test loss of 258 batches: 0.6788230518731035.
[ Wed Sep 28 06:55:44 2022 ] 	Top1: 79.49%
[ Wed Sep 28 06:55:44 2022 ] 	Top5: 96.28%
[ Wed Sep 28 06:55:44 2022 ] Training epoch: 77
[ Wed Sep 28 06:58:55 2022 ] 	Mean training loss: 0.4661. loss2: 0.0000. Mean training acc: 85.14%.
[ Wed Sep 28 06:58:55 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:58:55 2022 ] Eval epoch: 77
[ Wed Sep 28 06:59:25 2022 ] 	Mean test loss of 258 batches: 0.7255097535807032.
[ Wed Sep 28 06:59:25 2022 ] 	Top1: 79.50%
[ Wed Sep 28 06:59:25 2022 ] 	Top5: 95.35%
[ Wed Sep 28 06:59:25 2022 ] Training epoch: 78
[ Wed Sep 28 07:02:35 2022 ] 	Mean training loss: 0.4631. loss2: 0.0000. Mean training acc: 85.33%.
[ Wed Sep 28 07:02:35 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:02:35 2022 ] Eval epoch: 78
[ Wed Sep 28 07:03:05 2022 ] 	Mean test loss of 258 batches: 1.0279506017533384.
[ Wed Sep 28 07:03:05 2022 ] 	Top1: 70.96%
[ Wed Sep 28 07:03:06 2022 ] 	Top5: 93.12%
[ Wed Sep 28 07:03:06 2022 ] Training epoch: 79
[ Wed Sep 28 07:06:16 2022 ] 	Mean training loss: 0.4594. loss2: 0.0000. Mean training acc: 85.57%.
[ Wed Sep 28 07:06:16 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:06:16 2022 ] Eval epoch: 79
[ Wed Sep 28 07:06:46 2022 ] 	Mean test loss of 258 batches: 1.1496756641670716.
[ Wed Sep 28 07:06:46 2022 ] 	Top1: 69.70%
[ Wed Sep 28 07:06:47 2022 ] 	Top5: 91.68%
[ Wed Sep 28 07:06:47 2022 ] Training epoch: 80
[ Wed Sep 28 07:09:57 2022 ] 	Mean training loss: 0.4581. loss2: 0.0000. Mean training acc: 85.50%.
[ Wed Sep 28 07:09:57 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:09:57 2022 ] Eval epoch: 80
[ Wed Sep 28 07:10:27 2022 ] 	Mean test loss of 258 batches: 0.7138810677583828.
[ Wed Sep 28 07:10:27 2022 ] 	Top1: 79.49%
[ Wed Sep 28 07:10:27 2022 ] 	Top5: 95.80%
[ Wed Sep 28 07:10:27 2022 ] Training epoch: 81
[ Wed Sep 28 07:13:38 2022 ] 	Mean training loss: 0.4493. loss2: 0.0000. Mean training acc: 85.88%.
[ Wed Sep 28 07:13:38 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:13:38 2022 ] Eval epoch: 81
[ Wed Sep 28 07:14:08 2022 ] 	Mean test loss of 258 batches: 0.7453505483477615.
[ Wed Sep 28 07:14:08 2022 ] 	Top1: 78.71%
[ Wed Sep 28 07:14:08 2022 ] 	Top5: 95.85%
[ Wed Sep 28 07:14:08 2022 ] Training epoch: 82
[ Wed Sep 28 07:17:18 2022 ] 	Mean training loss: 0.4594. loss2: 0.0000. Mean training acc: 85.37%.
[ Wed Sep 28 07:17:18 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:17:19 2022 ] Eval epoch: 82
[ Wed Sep 28 07:17:49 2022 ] 	Mean test loss of 258 batches: 0.9092854892098626.
[ Wed Sep 28 07:17:49 2022 ] 	Top1: 74.99%
[ Wed Sep 28 07:17:49 2022 ] 	Top5: 94.69%
[ Wed Sep 28 07:17:49 2022 ] Training epoch: 83
[ Wed Sep 28 07:20:59 2022 ] 	Mean training loss: 0.4581. loss2: 0.0000. Mean training acc: 85.45%.
[ Wed Sep 28 07:20:59 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:20:59 2022 ] Eval epoch: 83
[ Wed Sep 28 07:21:29 2022 ] 	Mean test loss of 258 batches: 0.6310882867537728.
[ Wed Sep 28 07:21:30 2022 ] 	Top1: 81.08%
[ Wed Sep 28 07:21:30 2022 ] 	Top5: 96.69%
[ Wed Sep 28 07:21:30 2022 ] Training epoch: 84
[ Wed Sep 28 07:24:40 2022 ] 	Mean training loss: 0.4617. loss2: 0.0000. Mean training acc: 85.50%.
[ Wed Sep 28 07:24:40 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:24:40 2022 ] Eval epoch: 84
[ Wed Sep 28 07:25:10 2022 ] 	Mean test loss of 258 batches: 0.6077249590163083.
[ Wed Sep 28 07:25:10 2022 ] 	Top1: 81.56%
[ Wed Sep 28 07:25:10 2022 ] 	Top5: 96.99%
[ Wed Sep 28 07:25:10 2022 ] Training epoch: 85
[ Wed Sep 28 07:28:21 2022 ] 	Mean training loss: 0.4541. loss2: 0.0000. Mean training acc: 85.48%.
[ Wed Sep 28 07:28:21 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:28:21 2022 ] Eval epoch: 85
[ Wed Sep 28 07:28:51 2022 ] 	Mean test loss of 258 batches: 0.7493631237930105.
[ Wed Sep 28 07:28:51 2022 ] 	Top1: 78.43%
[ Wed Sep 28 07:28:51 2022 ] 	Top5: 95.78%
[ Wed Sep 28 07:28:51 2022 ] Training epoch: 86
[ Wed Sep 28 07:32:03 2022 ] 	Mean training loss: 0.4574. loss2: 0.0000. Mean training acc: 85.47%.
[ Wed Sep 28 07:32:03 2022 ] 	Time consumption: [Data]02%, [Network]97%
[ Wed Sep 28 07:32:03 2022 ] Eval epoch: 86
[ Wed Sep 28 07:32:33 2022 ] 	Mean test loss of 258 batches: 0.6439498060548953.
[ Wed Sep 28 07:32:33 2022 ] 	Top1: 80.90%
[ Wed Sep 28 07:32:33 2022 ] 	Top5: 96.74%
[ Wed Sep 28 07:32:33 2022 ] Training epoch: 87
[ Wed Sep 28 07:35:44 2022 ] 	Mean training loss: 0.4499. loss2: 0.0000. Mean training acc: 85.73%.
[ Wed Sep 28 07:35:44 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:35:44 2022 ] Eval epoch: 87
[ Wed Sep 28 07:36:14 2022 ] 	Mean test loss of 258 batches: 0.6840045575139134.
[ Wed Sep 28 07:36:14 2022 ] 	Top1: 79.71%
[ Wed Sep 28 07:36:14 2022 ] 	Top5: 96.16%
[ Wed Sep 28 07:36:14 2022 ] Training epoch: 88
[ Wed Sep 28 07:39:24 2022 ] 	Mean training loss: 0.4549. loss2: 0.0000. Mean training acc: 85.59%.
[ Wed Sep 28 07:39:24 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:39:24 2022 ] Eval epoch: 88
[ Wed Sep 28 07:39:54 2022 ] 	Mean test loss of 258 batches: 0.7979201421488163.
[ Wed Sep 28 07:39:55 2022 ] 	Top1: 78.21%
[ Wed Sep 28 07:39:55 2022 ] 	Top5: 94.47%
[ Wed Sep 28 07:39:55 2022 ] Training epoch: 89
[ Wed Sep 28 07:43:06 2022 ] 	Mean training loss: 0.4595. loss2: 0.0000. Mean training acc: 85.23%.
[ Wed Sep 28 07:43:06 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:43:06 2022 ] Eval epoch: 89
[ Wed Sep 28 07:43:36 2022 ] 	Mean test loss of 258 batches: 0.7626535204607386.
[ Wed Sep 28 07:43:36 2022 ] 	Top1: 78.20%
[ Wed Sep 28 07:43:36 2022 ] 	Top5: 95.77%
[ Wed Sep 28 07:43:36 2022 ] Training epoch: 90
[ Wed Sep 28 07:46:46 2022 ] 	Mean training loss: 0.4544. loss2: 0.0000. Mean training acc: 85.63%.
[ Wed Sep 28 07:46:46 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:46:47 2022 ] Eval epoch: 90
[ Wed Sep 28 07:47:16 2022 ] 	Mean test loss of 258 batches: 0.6349558849320855.
[ Wed Sep 28 07:47:17 2022 ] 	Top1: 81.09%
[ Wed Sep 28 07:47:17 2022 ] 	Top5: 96.90%
[ Wed Sep 28 07:47:17 2022 ] Training epoch: 91
[ Wed Sep 28 07:50:28 2022 ] 	Mean training loss: 0.2671. loss2: 0.0000. Mean training acc: 91.74%.
[ Wed Sep 28 07:50:28 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:50:28 2022 ] Eval epoch: 91
[ Wed Sep 28 07:50:58 2022 ] 	Mean test loss of 258 batches: 0.3586061628058899.
[ Wed Sep 28 07:50:58 2022 ] 	Top1: 89.02%
[ Wed Sep 28 07:50:58 2022 ] 	Top5: 98.28%
[ Wed Sep 28 07:50:58 2022 ] Training epoch: 92
[ Wed Sep 28 07:54:08 2022 ] 	Mean training loss: 0.2137. loss2: 0.0000. Mean training acc: 93.44%.
[ Wed Sep 28 07:54:08 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:54:08 2022 ] Eval epoch: 92
[ Wed Sep 28 07:54:38 2022 ] 	Mean test loss of 258 batches: 0.34581895810804625.
[ Wed Sep 28 07:54:38 2022 ] 	Top1: 89.31%
[ Wed Sep 28 07:54:39 2022 ] 	Top5: 98.31%
[ Wed Sep 28 07:54:39 2022 ] Training epoch: 93
[ Wed Sep 28 07:57:49 2022 ] 	Mean training loss: 0.1892. loss2: 0.0000. Mean training acc: 94.25%.
[ Wed Sep 28 07:57:49 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:57:49 2022 ] Eval epoch: 93
[ Wed Sep 28 07:58:19 2022 ] 	Mean test loss of 258 batches: 0.3481176186266334.
[ Wed Sep 28 07:58:19 2022 ] 	Top1: 89.51%
[ Wed Sep 28 07:58:20 2022 ] 	Top5: 98.34%
[ Wed Sep 28 07:58:20 2022 ] Training epoch: 94
[ Wed Sep 28 08:01:31 2022 ] 	Mean training loss: 0.1719. loss2: 0.0000. Mean training acc: 94.74%.
[ Wed Sep 28 08:01:31 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:01:31 2022 ] Eval epoch: 94
[ Wed Sep 28 08:02:01 2022 ] 	Mean test loss of 258 batches: 0.34647642096230225.
[ Wed Sep 28 08:02:01 2022 ] 	Top1: 89.70%
[ Wed Sep 28 08:02:01 2022 ] 	Top5: 98.26%
[ Wed Sep 28 08:02:01 2022 ] Training epoch: 95
[ Wed Sep 28 08:05:12 2022 ] 	Mean training loss: 0.1615. loss2: 0.0000. Mean training acc: 95.08%.
[ Wed Sep 28 08:05:12 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:05:12 2022 ] Eval epoch: 95
[ Wed Sep 28 08:05:42 2022 ] 	Mean test loss of 258 batches: 0.35674786625444427.
[ Wed Sep 28 08:05:42 2022 ] 	Top1: 89.29%
[ Wed Sep 28 08:05:42 2022 ] 	Top5: 98.17%
[ Wed Sep 28 08:05:42 2022 ] Training epoch: 96
[ Wed Sep 28 08:08:52 2022 ] 	Mean training loss: 0.1495. loss2: 0.0000. Mean training acc: 95.40%.
[ Wed Sep 28 08:08:52 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:08:52 2022 ] Eval epoch: 96
[ Wed Sep 28 08:09:22 2022 ] 	Mean test loss of 258 batches: 0.35404262811580833.
[ Wed Sep 28 08:09:22 2022 ] 	Top1: 89.52%
[ Wed Sep 28 08:09:23 2022 ] 	Top5: 98.28%
[ Wed Sep 28 08:09:23 2022 ] Training epoch: 97
[ Wed Sep 28 08:12:33 2022 ] 	Mean training loss: 0.1414. loss2: 0.0000. Mean training acc: 95.78%.
[ Wed Sep 28 08:12:33 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:12:33 2022 ] Eval epoch: 97
[ Wed Sep 28 08:13:03 2022 ] 	Mean test loss of 258 batches: 0.35314213600760513.
[ Wed Sep 28 08:13:03 2022 ] 	Top1: 89.51%
[ Wed Sep 28 08:13:03 2022 ] 	Top5: 98.28%
[ Wed Sep 28 08:13:03 2022 ] Training epoch: 98
[ Wed Sep 28 08:16:14 2022 ] 	Mean training loss: 0.1359. loss2: 0.0000. Mean training acc: 96.01%.
[ Wed Sep 28 08:16:14 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:16:14 2022 ] Eval epoch: 98
[ Wed Sep 28 08:16:44 2022 ] 	Mean test loss of 258 batches: 0.35248489677906036.
[ Wed Sep 28 08:16:44 2022 ] 	Top1: 89.56%
[ Wed Sep 28 08:16:44 2022 ] 	Top5: 98.33%
[ Wed Sep 28 08:16:44 2022 ] Training epoch: 99
[ Wed Sep 28 08:19:54 2022 ] 	Mean training loss: 0.1286. loss2: 0.0000. Mean training acc: 96.36%.
[ Wed Sep 28 08:19:54 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:19:54 2022 ] Eval epoch: 99
[ Wed Sep 28 08:20:24 2022 ] 	Mean test loss of 258 batches: 0.35595607697646053.
[ Wed Sep 28 08:20:24 2022 ] 	Top1: 89.49%
[ Wed Sep 28 08:20:24 2022 ] 	Top5: 98.34%
[ Wed Sep 28 08:20:24 2022 ] Training epoch: 100
[ Wed Sep 28 08:23:35 2022 ] 	Mean training loss: 0.1200. loss2: 0.0000. Mean training acc: 96.55%.
[ Wed Sep 28 08:23:35 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:23:35 2022 ] Eval epoch: 100
[ Wed Sep 28 08:24:05 2022 ] 	Mean test loss of 258 batches: 0.3583983797724395.
[ Wed Sep 28 08:24:05 2022 ] 	Top1: 89.55%
[ Wed Sep 28 08:24:05 2022 ] 	Top5: 98.22%
[ Wed Sep 28 08:24:05 2022 ] Training epoch: 101
[ Wed Sep 28 08:27:15 2022 ] 	Mean training loss: 0.0976. loss2: 0.0000. Mean training acc: 97.37%.
[ Wed Sep 28 08:27:15 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:27:16 2022 ] Eval epoch: 101
[ Wed Sep 28 08:27:46 2022 ] 	Mean test loss of 258 batches: 0.3438876838632679.
[ Wed Sep 28 08:27:46 2022 ] 	Top1: 90.03%
[ Wed Sep 28 08:27:46 2022 ] 	Top5: 98.27%
[ Wed Sep 28 08:27:46 2022 ] Training epoch: 102
[ Wed Sep 28 08:30:56 2022 ] 	Mean training loss: 0.0910. loss2: 0.0000. Mean training acc: 97.49%.
[ Wed Sep 28 08:30:56 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:30:56 2022 ] Eval epoch: 102
[ Wed Sep 28 08:31:26 2022 ] 	Mean test loss of 258 batches: 0.34240031422859474.
[ Wed Sep 28 08:31:26 2022 ] 	Top1: 90.05%
[ Wed Sep 28 08:31:26 2022 ] 	Top5: 98.30%
[ Wed Sep 28 08:31:26 2022 ] Training epoch: 103
[ Wed Sep 28 08:34:37 2022 ] 	Mean training loss: 0.0836. loss2: 0.0000. Mean training acc: 97.83%.
[ Wed Sep 28 08:34:37 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:34:37 2022 ] Eval epoch: 103
[ Wed Sep 28 08:35:07 2022 ] 	Mean test loss of 258 batches: 0.34342448699627387.
[ Wed Sep 28 08:35:07 2022 ] 	Top1: 90.00%
[ Wed Sep 28 08:35:07 2022 ] 	Top5: 98.27%
[ Wed Sep 28 08:35:07 2022 ] Training epoch: 104
[ Wed Sep 28 08:38:17 2022 ] 	Mean training loss: 0.0815. loss2: 0.0000. Mean training acc: 97.94%.
[ Wed Sep 28 08:38:17 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:38:17 2022 ] Eval epoch: 104
[ Wed Sep 28 08:38:47 2022 ] 	Mean test loss of 258 batches: 0.3474424598998455.
[ Wed Sep 28 08:38:47 2022 ] 	Top1: 90.14%
[ Wed Sep 28 08:38:48 2022 ] 	Top5: 98.26%
[ Wed Sep 28 08:38:48 2022 ] Training epoch: 105
[ Wed Sep 28 08:41:58 2022 ] 	Mean training loss: 0.0790. loss2: 0.0000. Mean training acc: 98.03%.
[ Wed Sep 28 08:41:58 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:41:58 2022 ] Eval epoch: 105
[ Wed Sep 28 08:42:28 2022 ] 	Mean test loss of 258 batches: 0.35050288549905945.
[ Wed Sep 28 08:42:28 2022 ] 	Top1: 89.87%
[ Wed Sep 28 08:42:28 2022 ] 	Top5: 98.24%
[ Wed Sep 28 08:42:28 2022 ] Training epoch: 106
[ Wed Sep 28 08:45:38 2022 ] 	Mean training loss: 0.0775. loss2: 0.0000. Mean training acc: 98.09%.
[ Wed Sep 28 08:45:38 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:45:39 2022 ] Eval epoch: 106
[ Wed Sep 28 08:46:08 2022 ] 	Mean test loss of 258 batches: 0.35119806945526094.
[ Wed Sep 28 08:46:09 2022 ] 	Top1: 90.06%
[ Wed Sep 28 08:46:09 2022 ] 	Top5: 98.28%
[ Wed Sep 28 08:46:09 2022 ] Training epoch: 107
[ Wed Sep 28 08:49:19 2022 ] 	Mean training loss: 0.0768. loss2: 0.0000. Mean training acc: 98.10%.
[ Wed Sep 28 08:49:19 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:49:19 2022 ] Eval epoch: 107
[ Wed Sep 28 08:49:49 2022 ] 	Mean test loss of 258 batches: 0.35172358263543874.
[ Wed Sep 28 08:49:49 2022 ] 	Top1: 90.10%
[ Wed Sep 28 08:49:49 2022 ] 	Top5: 98.26%
[ Wed Sep 28 08:49:49 2022 ] Training epoch: 108
[ Wed Sep 28 08:53:00 2022 ] 	Mean training loss: 0.0735. loss2: 0.0000. Mean training acc: 98.22%.
[ Wed Sep 28 08:53:00 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:53:00 2022 ] Eval epoch: 108
[ Wed Sep 28 08:53:29 2022 ] 	Mean test loss of 258 batches: 0.3538521278591812.
[ Wed Sep 28 08:53:30 2022 ] 	Top1: 89.80%
[ Wed Sep 28 08:53:30 2022 ] 	Top5: 98.25%
[ Wed Sep 28 08:53:30 2022 ] Training epoch: 109
[ Wed Sep 28 08:56:39 2022 ] 	Mean training loss: 0.0734. loss2: 0.0000. Mean training acc: 98.18%.
[ Wed Sep 28 08:56:39 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:56:39 2022 ] Eval epoch: 109
[ Wed Sep 28 08:57:09 2022 ] 	Mean test loss of 258 batches: 0.35225954649666713.
[ Wed Sep 28 08:57:10 2022 ] 	Top1: 90.10%
[ Wed Sep 28 08:57:10 2022 ] 	Top5: 98.30%
[ Wed Sep 28 08:57:10 2022 ] Training epoch: 110
[ Wed Sep 28 09:00:18 2022 ] 	Mean training loss: 0.0691. loss2: 0.0000. Mean training acc: 98.36%.
[ Wed Sep 28 09:00:18 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 09:00:18 2022 ] Eval epoch: 110
[ Wed Sep 28 09:00:48 2022 ] 	Mean test loss of 258 batches: 0.3549836320593202.
[ Wed Sep 28 09:00:48 2022 ] 	Top1: 89.98%
[ Wed Sep 28 09:00:48 2022 ] 	Top5: 98.19%
[ Wed Sep 28 09:01:18 2022 ] Best accuracy: 0.9013768423606477
[ Wed Sep 28 09:01:18 2022 ] Epoch number: 104
[ Wed Sep 28 09:01:18 2022 ] Model name: work_dir/ntu60/csub/fc_joint
[ Wed Sep 28 09:01:18 2022 ] Model total number of params: 2082097
[ Wed Sep 28 09:01:18 2022 ] Weight decay: 0.0004
[ Wed Sep 28 09:01:18 2022 ] Base LR: 0.1
[ Wed Sep 28 09:01:18 2022 ] Batch Size: 64
[ Wed Sep 28 09:01:18 2022 ] Test Batch Size: 64
[ Wed Sep 28 09:01:18 2022 ] seed: 1
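
The log above shows the effect of the configured LR schedule (`step: [90, 100]`, `lr_decay_rate: 0.1`): Top-1 jumps from roughly 80% to 89% at the epoch-91 eval, right after the first decay. A minimal parsing sketch for pulling those per-epoch Top-1 numbers out of this log format — `parse_top1` is a hypothetical helper written against the line patterns shown here, not part of the training code:

```python
import re

def parse_top1(log_text):
    """Extract (epoch, top1_percent) pairs from a log in the format above.

    Assumes each 'Eval epoch: N' line is followed by a 'Top1: X%' line,
    as in this run's output.
    """
    epochs = re.findall(r"Eval epoch: (\d+)", log_text)
    top1s = re.findall(r"Top1: ([\d.]+)%", log_text)
    return [(int(e), float(t)) for e, t in zip(epochs, top1s)]

sample = (
    "[ Wed Sep 28 07:50:28 2022 ] Eval epoch: 91\n"
    "[ Wed Sep 28 07:50:58 2022 ] \tTop1: 89.02%\n"
    "[ Wed Sep 28 07:50:58 2022 ] \tTop5: 98.28%\n"
)
print(parse_top1(sample))  # [(91, 89.02)]
```

Running this over the full log and taking the max would recover the summary line's best accuracy (90.14% at epoch 104, matching `Best accuracy: 0.9013768423606477` up to the averaging over test batches).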
